Deplatforming, demotion and folk theories of Big Tech persecution
This article examines YouTube's moderation of conspiracy narratives surrounding COVID-19 through an analysis of deplatformed and demoted YouTube videos. Building upon the literature on moderation, it compares the types of content moderated by YouTube from April to October 2020. In doing so, it seeks to determine the extent to which YouTube's own moderation actions are brought in as part of the conspiratorial narratives surrounding COVID-19, while investigating how moderation becomes entangled with questions of truth and visibility.
Keywords: content moderation, conspiracy theories, YouTube
Capturing Gendered Mobility and Street Use in the Historical City: A New Methodological Approach
While in the social sciences everyday mobility and street use are seen as central to the understanding of urban societies, in the work of historians these phenomena only play a limited role. Building upon methods from related fields and using digital tools, this article proposes a new methodological approach to study historical mobility and street use. This 'snapshot approach' facilitates intercultural comparability and creates possibilities for systematic spatial analyses. We propose it forms an important tool to enhance our understanding of gendered urban experience in the past.
From Text Mining to Visual Classification: Rethinking Computational New Cinema History with Jean Desmet's Digitised Business Archive
Focusing on the specific case of cinema owner and film distributor Jean Desmet's digitised business archive, this article discusses how computational approaches may help unlock the archive for researchers in the Dutch national research infrastructure CLARIAH Media Suite. To this end, the article places previous computational approaches to film-related sources in New Cinema History research in historical perspective and suggests a novel approach which combines text mining and visual classification. The article argues that such a combination is necessary to yield results which reflect the archive's material heterogeneity and complexity, and that it offers a new direction for computational approaches in New Cinema History and their conceptualisations of film-related materials as historical sources.
Pottery Goes Public. Performing Archaeological Research Amid the Audience
The project Pottery Goes Public explores the potential of 3D analytical tools to assess to what extent they can provide us with new interpretations and insights into the technological aspects of ancient pottery manufacturing. However, developing innovative 3D imaging techniques for ceramic analysis is not the only aim of the project. Since its inception, Pottery Goes Public has been designed to involve a wider audience not only in the study of ancient potting techniques, but also in the very process of carrying out the research. As advocated by the proponents of a reflexive approach to archaeology, in order to make the past relevant to contemporary society it is imperative for the archaeologist to include all interested parties in every stage of the analysis, from the formulation of the research questions to the dissemination of outputs. In this sense, the deployment of modern 3D technologies proved to be an indisputably powerful medium of communication and interaction with the public at large. Performing live archaeological research with cutting-edge tools is a key step towards opening up academic research to multiple actors and actively engaging them with the archaeological interpretative process.
On Altpedias: partisan epistemics in the encyclopaedias of alternative facts
This article considers how online alternative encyclopaedias, or 'Altpedias', create and maintain their own universes of 'alternative facts'. We consider a selection of Altpedias that reject Wikipedia's celebrated 'neutral point of view' as an artefact of liberal consensus politics whilst regarding their own epistemics as inherently partisan. As opposed to disregarding objectivity or truth, Altpedias' 'alternative facts' may thus be understood as the product of competing normative standpoints concerning the use value of knowledge. In competing with Wikipedia, Altpedias ultimately attempt to give their partisan viewpoints universal standards, both in tone and in their very nature as wiki platforms. Empirically, the article uses visual network analysis and natural language processing in order to represent the vernacular worldviews of several far- and extreme-right Altpedias: Metapedia, Infogalactic and Rightpedia. Theoretically, the article frames these Altpedias' fractious approach to the study of knowledge in relation to Lyotard's 'general agonistic' and his speculations concerning the impact of computation on epistemics in the postmodern condition.
Semantic Deep Mapping in the Amsterdam Time Machine: Viewing Late 19th- and Early 20th-Century Theatre and Cinema Culture Through the Lens of Language Use and Socio-Economic Status
In this paper, we present our work on semantic deep mapping at scale by combining information from various sources and disciplines to study historical Amsterdam. We model our data according to semantic web standards and ground them in space and time, such that we can investigate what happened at a particular time and place from a linguistic, socio-economic and urban-historical perspective. In a small use case we test the spatio-temporal infrastructure for research on entertainment culture in Amsterdam around the turn of the 20th century. We explain the bottlenecks we encountered in integrating information from different disciplines and sources, and how we resolved or worked around them. Finally, we present a set of recommendations and best practices for adapting semantic deep mapping to other settings.
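The idea of grounding heterogeneous data in space and time so that it can be queried by place and period can be illustrated with a toy triple store. This is a minimal pure-Python sketch with hypothetical identifiers and made-up data, not the project's actual semantic web stack (which would use RDF tooling and standard vocabularies):

```python
# Toy triple store: each fact is a (subject, predicate, object) triple,
# and subjects are grounded in space (lat/long) and time (a year).
triples = set()

def add(s, p, o):
    triples.add((s, p, o))

# Hypothetical data: an Amsterdam cinema, its location and opening year.
add("ex:cinema-royal", "rdf:type", "schema:MovieTheater")
add("ex:cinema-royal", "schema:name", "Cinema Royal")
add("ex:cinema-royal", "geo:lat", 52.373)
add("ex:cinema-royal", "geo:long", 4.892)
add("ex:cinema-royal", "schema:foundingDate", 1913)

def query(p, test=lambda o: True):
    """Return subjects that have a (p, o) triple whose object passes the test."""
    return {s for (s, pred, o) in triples if pred == p and test(o)}

# A spatio-temporal question: which venues opened around the turn
# of the 20th century?
early = query("schema:foundingDate", lambda y: 1890 <= y <= 1920)
print(early)
```

In an actual semantic web setup the same pattern would be expressed as RDF triples and the filter as a SPARQL query, which is what makes data from different disciplines composable once they share identifiers for places and periods.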